
    Trade-Revealed TFP

    We introduce a novel methodology to measure the relative TFP of the tradeable sector across countries, based on the relationship between trade and TFP in the model of Eaton and Kortum (2002). The logic of our approach is to measure TFP not from its "primitive" (the production function) but from its observed implications. In particular, we estimate TFPs as the productivities that best fit data on trade, production, and wages. Applying this methodology to a sample of 19 OECD countries, we estimate the TFP of each country's manufacturing sector from 1985 to 2002. Our measures are easy to compute and, with respect to the standard development-accounting approach, are no longer mere residuals. Nor do they yield common "anomalies", such as the higher TFP of Italy relative to the US.

    Keywords: multi-factor productivity, TFP measurement, Eaton-Kortum model
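
    The inversion behind this approach can be illustrated with a toy version of the Eaton-Kortum trade-share equation. The sketch below is illustrative only: it assumes the trade costs d, input costs c and dispersion parameter theta are known, generates trade shares from known technology parameters T_i, and then recovers the relative T_i (which the paper maps into TFP) from the shares alone; the paper's actual estimation is richer, fitting production and wage data as well.

```python
import math

def trade_shares(T, c, d, theta):
    """Eaton-Kortum trade shares: pi[n][i] is the share of country n's
    spending that falls on goods from country i."""
    n_c = len(T)
    pi = []
    for n in range(n_c):
        w = [T[i] * (c[i] * d[n][i]) ** (-theta) for i in range(n_c)]
        phi = sum(w)
        pi.append([wi / phi for wi in w])
    return pi

def revealed_tfp(pi, c, d, theta):
    """Invert the share equation to recover T_i relative to country 0:
    log(T_i/T_0) = log(pi_0i/pi_00) + theta*log(c_i*d_0i/(c_0*d_00))."""
    return [math.exp(math.log(pi[0][i] / pi[0][0])
                     + theta * math.log(c[i] * d[0][i] / (c[0] * d[0][0])))
            for i in range(len(c))]

# hypothetical three-country example (numbers are made up)
T_true = [1.0, 1.5, 0.8]           # technology parameters, country 0 = base
c = [1.0, 1.2, 0.9]                # input costs
d = [[1.0, 1.3, 1.4],              # iceberg trade costs, d[n][i] >= 1
     [1.3, 1.0, 1.2],
     [1.4, 1.2, 1.0]]
pi = trade_shares(T_true, c, d, theta=4.0)
T_hat = revealed_tfp(pi, c, d, theta=4.0)   # recovers T relative to country 0
```

    In this noiseless toy setting the inversion is exact; with real data the paper instead searches for the productivities that best fit the observed trade, production, and wage patterns.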

    Ricardian selection

    We analyze the foundations of the relationship between trade and total factor productivity (TFP) in the Ricardian model. Under general assumptions about the autarky distributions of industry productivities, trade openness raises TFP. This is due to the selection effect of international competition – driven by comparative advantages – which makes "some" high- and "many" low-productivity industries exit the market. We derive a model-based measure of this effect that requires only production and trade data. In a sample of 41 countries, we find that Ricardian selection raised manufacturing TFP by 11% above the autarky level in 2005 (as against 6% in 1985), with a clear positive time trend and large cross-country differences.

    Keywords: selection effect, Eaton-Kortum model, international competition

    Constraints on Modified Gravity from ACT and SPT

    The Atacama Cosmology Telescope (ACT) and the South Pole Telescope (SPT) have recently provided new and precise measurements of the Cosmic Microwave Background anisotropy damping tail. This region of the CMB angular spectra, thanks to the angular distortions produced by gravitational lensing, can probe the growth of matter perturbations and provide a test for general relativity. Here we make use of the ACT and SPT power spectrum measurements (combined with the recent WMAP9 data) to constrain f(R) gravity theories. Adopting a parametrized approach, we obtain an upper limit on the length scale of the theory of B_0 < 0.86 at 95% c.l. from ACT, while we get a significantly stronger bound from SPT with B_0 < 0.14 at 95% c.l.

    Comment: 6 pages, 4 figures, some sentences corrected

    A comparison of structural reform scenarios across the EU member states - Simulation-based analysis using the QUEST model with endogenous growth

    This paper calibrates the Roeger-Varga-Veld (2008) micro-founded DSGE model with endogenous growth for all EU member states using country-specific structural characteristics, and employs the individual country models to analyse the macroeconomic impact of various structural reforms. We analyse the costs and benefits of reforms in terms of fiscal policy instruments such as taxes, benefits, subsidies and administrative costs faced by firms. We find that less R&D intensive countries would benefit the most from R&D promoting and skill-upgrading policies. We also find that shifting from labour to consumption taxes, reducing the benefit replacement rate and relieving administrative entry barriers are the most effective measures in those countries which have high labour taxes and entry barriers.

    Keywords: structural reforms, endogenous growth, DSGE modelling, EU member states, tax credits, tax shifts, entry barriers, human capital, D'Auria, Pagano, Ratto, Varga

    Constraining images of quadratic arboreal representations

    In this paper, we prove several results on finitely generated dynamical Galois groups attached to quadratic polynomials. First we show that, over global fields, quadratic post-critically finite polynomials are precisely those having an arboreal representation whose image is topologically finitely generated. To obtain this result, we also prove the quadratic case of Hindes' conjecture on dynamical non-isotriviality. Next, we give two applications of this result. On the one hand, we prove that quadratic polynomials over global fields with abelian dynamical Galois group are necessarily post-critically finite, and we combine our results with local class field theory to classify quadratic pairs over Q with abelian dynamical Galois group, improving on recent results of Andrews and Petsche. On the other hand, we show that several infinite families of subgroups of the automorphism group of the infinite binary tree cannot appear as images of arboreal representations of quadratic polynomials over number fields, yielding unconditional evidence towards Jones' finite index conjecture.

    Comment: Sections 3 and 4 now swapped. Accepted for publication on IMR

    Blue Gravity Waves from BICEP2 ?

    We present new constraints on the spectral index n_T of tensor fluctuations from the recent data obtained by the BICEP2 experiment. We find that the BICEP2 data alone slightly prefer a positive, "blue", spectral index with n_T = 1.36 ± 0.83 at 68% c.l. However, when a TT prior on the tensor amplitude coming from temperature anisotropy measurements is assumed, we get n_T = 1.67 ± 0.53 at 68% c.l., ruling out a scale-invariant n_T = 0 spectrum at more than three standard deviations. These results are at odds with current bounds on the tensor spectral index coming from pulsar timing, Big Bang Nucleosynthesis, and direct measurements from the LIGO experiment. Considering only the possibility of a "red", n_T < 0, spectral index, we obtain the lower limit n_T > -0.76 at 68% c.l. (n_T > -0.09 when a TT prior is included).

    Comment: 3 pages, 4 figures

    Factor Mapping and Metamodelling

    In this work we present techniques, within the realm of Global Sensitivity Analysis, that address fundamental questions of model understanding. In particular, we are interested in developing tools that determine which factors (or groups of factors) are most responsible for producing model outputs Y within or outside specified bounds, ranking the various input factors in terms of their influence on the variation of Y. We also seek to represent in a direct way (graphically, analytically, etc.) the relationship between the input factors X_1, ..., X_k and the output Y, in order to gain a better understanding of the model itself.

    JRC.G.9-Econometrics and statistical support to antifraud
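
    One standard factor-mapping technique in Global Sensitivity Analysis is Monte Carlo filtering: sample the input factors, split the runs into those producing Y inside the target bounds and those outside, and rank the factors by the distance between the two conditional input distributions. A minimal self-contained sketch on a hypothetical toy model (the model, bound, and Kolmogorov-Smirnov ranking here are illustrative, not taken from the paper):

```python
import random

def toy_model(x1, x2, x3):
    # hypothetical model in which X_1 dominates the variation of Y
    return 4.0 * x1 + x2 + 0.1 * x3

def ks_distance(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest gap between
    the empirical CDFs of samples a and b."""
    a, b = sorted(a), sorted(b)
    i, j, d = 0, 0, 0.0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            i += 1
        else:
            j += 1
        d = max(d, abs(i / len(a) - j / len(b)))
    return d

def factor_mapping(n=2000, threshold=3.0, seed=42):
    """Monte Carlo filtering: split sampled inputs by whether Y exceeds
    the bound, then rank factors by the KS distance between the groups."""
    rng = random.Random(seed)
    inside, outside = [], []
    for _ in range(n):
        x = [rng.random(), rng.random(), rng.random()]
        (inside if toy_model(*x) > threshold else outside).append(x)
    return [ks_distance([p[k] for p in inside], [p[k] for p in outside])
            for k in range(3)]

scores = factor_mapping()   # X_1 should come out as the driving factor
```

    A large KS distance means the factor's distribution differs sharply between the runs that meet the bound and those that do not, i.e. that factor drives Y into or out of the target region.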

    A simulation approach to distinguish risk contribution roles to systemic crises

    The last financial crisis has shown that large banking crises pose a highly dangerous risk to both the real economy and public finances. Reducing that risk has become a priority for regulators and governments, but the debate on how to deal with it remains open. Contagion plays a key role: domino effects can turn a relatively small difficulty into a systemic crisis. It is thus important to assess how contagion spreads across banking systems and how to distinguish the two roles played by ‘lighters’ and ‘fuel’ in the crisis, i.e. which banks are likely to start financial contagion and which have a ‘passive’ role in just being driven to default by contagion. The aim of this paper is to propose a methodology for distinguishing the two roles, and for assessing their different contributions to systemic crises. For this purpose, we have adapted a Monte Carlo simulation-based approach for banking systems which models both correlation and contagion between banks. Selecting large crises in simulations, and finding which banks started each simulated crisis, allows us to distinguish ‘primary’ and ‘induced’ defaults and fragility, and to determine the contribution of individual banks to the triggering of systemic crises. The analysis has been tested on a sample of 83 Danish banks for 2010.

    JRC.G.1-Scientific Support to Financial Analysis
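
    The ‘lighters’ versus ‘fuel’ distinction can be sketched with a toy version of such a simulation: banks default either primarily (their own shock exhausts their capital) or by induction (contagion losses through interbank exposures push them over). The three-bank system, shock distribution, and loss-given-default below are hypothetical and far simpler than the paper's correlated Monte Carlo setup:

```python
import random

def simulate_once(capital, exposure, rng, lgd=0.4):
    """One scenario: draw a shock per bank, mark primary defaults, then
    propagate contagion losses through interbank exposures."""
    n = len(capital)
    shock = [rng.expovariate(1.0 / 3.0) for _ in range(n)]
    defaulted = [shock[i] > capital[i] for i in range(n)]
    primary = {i for i in range(n) if defaulted[i]}
    changed = True
    while changed:                      # keep propagating until stable
        changed = False
        for i in range(n):
            if defaulted[i]:
                continue
            contagion = sum(lgd * exposure[i][j]
                            for j in range(n) if defaulted[j])
            if shock[i] + contagion > capital[i]:
                defaulted[i] = True
                changed = True
    induced = {i for i in range(n) if defaulted[i]} - primary
    return primary, induced

def contribution_roles(n_sims=2000, seed=1):
    """Count, per bank, how often it is a 'lighter' (primary default)
    versus 'fuel' (induced default) across simulated scenarios."""
    capital = [2.0, 10.0, 10.0]         # bank 0 is thinly capitalised
    exposure = [[0.0, 0.0, 0.0],        # exposure[i][j]: claim of i on j
                [20.0, 0.0, 0.0],       # banks 1 and 2 lend heavily
                [20.0, 0.0, 0.0]]       # to the fragile bank 0
    rng = random.Random(seed)
    primary_count, induced_count = [0, 0, 0], [0, 0, 0]
    for _ in range(n_sims):
        primary, induced = simulate_once(capital, exposure, rng)
        for i in primary:
            primary_count[i] += 1
        for i in induced:
            induced_count[i] += 1
    return primary_count, induced_count

primary_count, induced_count = contribution_roles()
```

    In this toy system the thinly capitalised bank shows up mainly as a ‘lighter’ (frequent primary defaults) while its creditors show up mainly as ‘fuel’ (defaults induced through their exposures), which is exactly the decomposition the paper extracts from simulated crises.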

    Adapting and Optimizing the Systemic Model of Banking Originated Losses (SYMBOL) Tool to the Multi-core Architecture

    Currently, the multi-core system is a predominant architecture in the computational world. This opens new possibilities for speeding up statistical and numerical simulations, but it also introduces many challenges we need to deal with. In order to improve performance, we need to consider several key points: core communication, data locality, dependencies, memory size, etc. This paper describes a series of optimization steps applied to the SYMBOL model to enhance its performance and scalability. SYMBOL is a micro-founded statistical tool which analyses the consequences of bank failures, taking into account the available safety nets, such as deposit guarantee schemes or resolution funds. In its original version, however, the tool has some computational weaknesses: its execution time grows considerably when it is run with large input data (e.g. large banking systems) or when the stopping criterion, i.e. the number of default scenarios to be considered, is scaled up. Our intention is to develop a tool (extendable to other models having similar characteristics) in which a set of serial (e.g. deleting redundancies, loop unrolling) and parallel strategies (e.g. OpenMP and GPU programming) come together to obtain shorter execution times and scalability. The tool uses automatic configuration to make the best use of the available resources, based on the characteristics of the input datasets. Experimental results, obtained by varying the size of the input dataset and the stopping criterion, show the considerable improvement the new tool delivers, with execution-time reductions of up to 96% with respect to the original serial version.

    JRC.G.1-Financial and Economic Analysis
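
    The stopping criterion mentioned above can be sketched as a chunked Monte Carlo loop that stops as soon as the requested number of default scenarios has been collected, rather than running a fixed number of draws. The bank-loss model and numbers below are hypothetical, and none of SYMBOL's actual serial or parallel optimizations (loop unrolling, OpenMP, GPU) are shown:

```python
import random

def one_scenario(capital, rng):
    """Draw a loss for every bank; the scenario counts as a 'default
    scenario' if at least one bank's loss exceeds its capital."""
    losses = [rng.expovariate(1.0) for _ in capital]
    return losses, any(l > c for l, c in zip(losses, capital))

def collect_default_scenarios(capital, k, chunk=256, seed=0):
    """Run the Monte Carlo in fixed-size chunks, stopping as soon as k
    default scenarios have been collected (the stopping criterion)."""
    rng = random.Random(seed)
    kept, draws = [], 0
    while len(kept) < k:
        for _ in range(chunk):
            draws += 1
            losses, is_default = one_scenario(capital, rng)
            if is_default:
                kept.append(losses)
                if len(kept) == k:
                    break
    return kept, draws

# hypothetical three-bank system with identical capital buffers
scenarios, draws = collect_default_scenarios([4.0, 4.0, 4.0], k=100)
```

    Chunking is what makes the loop amenable to parallel execution: independent chunks can be farmed out to separate cores and their default scenarios merged until the criterion is met.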